Approximation Algorithms for Complex-Valued Ising Models on Bounded Degree Graphs
We study the problem of approximating the Ising model partition function with
complex parameters on bounded degree graphs. We establish a deterministic
polynomial-time approximation scheme for the partition function when the
interactions and external fields are absolutely bounded close to zero.
Furthermore, we prove that for this class of Ising models the partition
function does not vanish. Our algorithm is based on an approach due to Barvinok
for approximating evaluations of a polynomial based on the location of the
complex zeros and a technique due to Patel and Regts for efficiently computing
the leading coefficients of graph polynomials on bounded degree graphs.
Finally, we show how our algorithm can be extended to approximate certain
output probability amplitudes of quantum circuits.
Comment: 12 pages, 0 figures, published version
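As a toy illustration of the object being approximated (this is brute-force enumeration, not the paper's Barvinok/Patel-Regts algorithm, which avoids the exponential sum), the complex-valued Ising partition function can be computed directly on a tiny graph; the graph and parameter values below are hypothetical:

```python
import cmath
from itertools import product

def ising_partition_function(edges, beta, h, n):
    """Brute-force Z = sum over spin configurations s in {-1,+1}^n of
    exp(beta * sum_{(i,j) in edges} s_i s_j + h * sum_i s_i).
    The interaction beta and external field h may be complex.
    Exponential in n -- usable only for very small graphs."""
    Z = 0 + 0j
    for spins in product((-1, 1), repeat=n):
        energy = beta * sum(spins[i] * spins[j] for i, j in edges)
        energy += h * sum(spins)
        Z += cmath.exp(energy)
    return Z

# Single edge with zero couplings: all four spin configurations
# contribute exp(0) = 1, so Z = 4.
print(ising_partition_function([(0, 1)], 0j, 0j, 2))  # (4+0j)
```

The paper's zero-freeness result concerns exactly this quantity: for interactions and fields in a suitable neighbourhood of zero, Z never vanishes, which is what makes Barvinok-style Taylor approximation of log Z possible.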
Commuting Quantum Circuits with Few Outputs are Unlikely to be Classically Simulatable
We study the classical simulatability of commuting quantum circuits with n
input qubits and O(log n) output qubits, where a quantum circuit is classically
simulatable if its output probability distribution can be sampled up to an
exponentially small additive error in classical polynomial time. First, we show
that there exists a commuting quantum circuit that is not classically
simulatable unless the polynomial hierarchy collapses to the third level. This
is the first formal evidence that a commuting quantum circuit is not
classically simulatable even when the number of output qubits is exponentially
small. Then, we consider a generalized version of the circuit and clarify the
condition under which it is classically simulatable. Lastly, we apply the
argument for the above evidence to Clifford circuits in a similar setting and
provide evidence that such a circuit augmented by a depth-1 non-Clifford layer
is not classically simulatable. These results reveal subtle differences between
quantum and classical computation.
Comment: 19 pages, 6 figures; v2: Theorems 1 and 3 improved, proofs modified
Achieving quantum supremacy with sparse and noisy commuting quantum computations
The class of commuting quantum circuits known as IQP (instantaneous quantum
polynomial-time) has been shown to be hard to simulate classically, assuming
certain complexity-theoretic conjectures. Here we study the power of IQP
circuits in the presence of physically motivated constraints. First, we show
that there is a family of sparse IQP circuits that can be implemented on a
square lattice of n qubits in depth O(sqrt(n) log n), and which is likely hard
to simulate classically. Next, we show that, if an arbitrarily small constant
amount of noise is applied to each qubit at the end of any IQP circuit whose
output probability distribution is sufficiently anticoncentrated, there is a
polynomial-time classical algorithm that simulates sampling from the resulting
distribution, up to constant accuracy in total variation distance. However, we
show that purely classical error-correction techniques can be used to design
IQP circuits which remain hard to simulate classically, even in the presence of
arbitrary amounts of noise of this form. These results demonstrate the
challenges faced by experiments designed to demonstrate quantum supremacy over
classical computation, and how these challenges can be overcome.
Comment: 23 pages, 1 figure; v4: uses standard journal style
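For concreteness (this closed form is standard, not specific to the paper): an n-qubit IQP circuit applies Hadamards to every qubit, then a diagonal unitary D, then Hadamards again, so the amplitude of outcome y is 2^{-n} Σ_x (-1)^{x·y} d(x), where d(x) is the diagonal phase on basis state x. A minimal brute-force evaluator, with a hypothetical phase function:

```python
import cmath
from itertools import product

def iqp_output_probs(n, phase):
    """Output distribution of the IQP circuit H^n . D . H^n |0^n>,
    where D|x> = exp(i * phase(x)) |x>.  Brute force over all 2^n
    inputs and outputs, so O(4^n) -- a toy for tiny n only."""
    xs = list(product((0, 1), repeat=n))
    probs = {}
    for y in xs:
        amp = sum((-1) ** sum(xi * yi for xi, yi in zip(x, y))
                  * cmath.exp(1j * phase(x)) for x in xs) / 2 ** n
        probs[y] = abs(amp) ** 2
    return probs

# Trivial diagonal (all phases zero): the two Hadamard layers cancel,
# so the circuit outputs |0^n> with certainty.
p = iqp_output_probs(2, lambda x: 0.0)
print(p[(0, 0)])  # 1.0 (up to floating point)
```

The anticoncentration property the abstract refers to is a statement about this distribution: for suitably random phases, no single outcome probability is much larger than the uniform value 2^{-n}.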
Characterizing quantum supremacy in near-term devices
© 2018 Macmillan Publishers Limited, part of Springer Nature. All rights reserved. A critical question for quantum computing in the near future is whether quantum devices without error correction can perform a well-defined computational task beyond the capabilities of supercomputers. Such a demonstration of what is referred to as quantum supremacy requires a reliable evaluation of the resources required to solve tasks with classical approaches. Here, we propose the task of sampling from the output distribution of random quantum circuits as a demonstration of quantum supremacy. We extend previous results in computational complexity to argue that this sampling task must take exponential time in a classical computer. We introduce cross-entropy benchmarking to obtain the experimental fidelity of complex multiqubit dynamics. This can be estimated and extrapolated to give a success metric for a quantum supremacy demonstration. We study the computational cost of relevant classical algorithms and conclude that quantum supremacy can be achieved with circuits in a two-dimensional lattice of 7 × 7 qubits and around 40 clock cycles. This requires an error rate of around 0.5% for two-qubit gates (0.05% for one-qubit gates), and it would demonstrate the basic building blocks for a fault-tolerant quantum computer.
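The paper's cross-entropy benchmark compares log-probabilities of observed bitstrings under the ideal circuit against what uncorrelated noise would give. A closely related and simpler estimator popularised in later work is the linear cross-entropy fidelity, F = D · ⟨p_ideal(x_i)⟩ − 1, averaged over measured samples x_i; F ≈ 1 for a perfect device and F ≈ 0 for uniform noise. A sketch with a made-up distribution:

```python
def linear_xeb_fidelity(ideal_probs, samples):
    """Linear cross-entropy benchmarking fidelity estimate:
    F = D * mean(p_ideal(x_i)) - 1 over observed bitstrings x_i,
    where D is the dimension of the output space."""
    D = len(ideal_probs)
    mean_p = sum(ideal_probs[x] for x in samples) / len(samples)
    return D * mean_p - 1.0

# Hypothetical 2-qubit ideal distribution; the sample frequencies are
# chosen to match it exactly, for a deterministic illustration.
ideal = {0: 0.4, 1: 0.3, 2: 0.2, 3: 0.1}
samples = [0] * 4 + [1] * 3 + [2] * 2 + [3] * 1
print(linear_xeb_fidelity(ideal, samples))  # ~0.2
```

Samples drawn uniformly instead (each outcome once) give mean probability 1/D and hence F = 0, which is the "noise floor" the benchmark is designed to distinguish from genuine circuit fidelity.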
The Born supremacy: quantum advantage and training of an Ising Born machine
The search for an application of near-term quantum devices is widespread.
Quantum Machine Learning is touted as a potential utilisation of such devices,
particularly those which are out of the reach of the simulation capabilities of
classical computers. In this work, we propose a generative Quantum Machine
Learning Model, called the Ising Born Machine (IBM), which we show cannot, in
the worst case, and up to suitable notions of error, be simulated efficiently
by a classical device. We also show this holds for all the circuit families
encountered during training. In particular, we explore quantum circuit learning
using non-universal circuits derived from Ising Model Hamiltonians, which are
implementable on near term quantum devices.
We propose two novel training methods for the IBM by utilising the Stein
Discrepancy and the Sinkhorn Divergence cost functions. We show numerically,
both using a simulator within Rigetti's Forest platform and on the Aspen-1 16Q
chip, that the cost functions we suggest outperform the more commonly used
Maximum Mean Discrepancy (MMD) for differentiable training. We also propose an
improvement to the MMD by proposing a novel utilisation of quantum kernels
which we demonstrate provides improvements over its classical counterpart. We
discuss the potential of these methods to learn `hard' quantum distributions, a
feat which would demonstrate the advantage of quantum over classical computers,
and provide the first formal definitions for what we call `Quantum Learning
Supremacy'. Finally, we propose a novel view on the area of quantum circuit
compilation by using the IBM to `mimic' target quantum circuits using classical
output data only.
Comment: v3: close to journal published version; significant text-structure change, split into main text & appendices (see v2 for unsplit version). v2: typos corrected, figures altered slightly. v1: 68 pages, 39 figures. Comments welcome. Implementation at https://github.com/BrianCoyle/IsingBornMachin
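The Maximum Mean Discrepancy the abstract compares against is a kernel-based distance between distributions: MMD²(P, Q) = E[k(x,x')] + E[k(y,y')] − 2E[k(x,y)]. A minimal biased (V-statistic) estimator with a classical Gaussian kernel, independent of the paper's quantum-kernel construction:

```python
import math

def gaussian_kernel(x, y, sigma=1.0):
    """Standard Gaussian (RBF) kernel on real inputs."""
    return math.exp(-(x - y) ** 2 / (2 * sigma ** 2))

def mmd_squared(xs, ys, kernel=gaussian_kernel):
    """Biased (V-statistic) estimate of MMD^2 between samples xs ~ P
    and ys ~ Q.  Being a squared RKHS norm, it is always >= 0, and it
    is exactly 0 when the two sample sets coincide."""
    def mean_k(a, b):
        return sum(kernel(u, v) for u in a for v in b) / (len(a) * len(b))
    return mean_k(xs, xs) + mean_k(ys, ys) - 2 * mean_k(xs, ys)

print(mmd_squared([0.0, 1.0], [0.0, 1.0]))          # 0.0
print(mmd_squared([0.0, 0.1], [5.0, 5.1]) > 0.1)    # True
```

Training a Born machine against the MMD means differentiating an estimate like this with respect to circuit parameters; the paper's Stein discrepancy and Sinkhorn divergence replace the kernel-mean comparison with different (and, per its experiments, better-behaved) cost functions.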
Fluoxetine: a case history of its discovery and preclinical development
Introduction: Depression is a multifactorial mood disorder with a high prevalence worldwide. Until now, treatments for depression have focused on the inhibition of monoaminergic reuptake sites, which augment the bioavailability of monoamines in the CNS. Advances in drug discovery have widened the therapeutic options with the synthesis of so-called selective serotonin reuptake inhibitors (SSRIs), such as fluoxetine.
Areas covered: The aim of this case history is to describe and discuss the pharmacokinetic and pharmacodynamic profiles of fluoxetine, including its acute effects and the adaptive changes induced after long-term treatment.
Furthermore, the authors review the effect of fluoxetine on neuroplasticity and adult neurogenesis. In addition, the article summarises the preclinical behavioural data available on fluoxetine's effects on depressive-like behaviour,
anxiety and cognition as well as its effects on other diseases. Finally, the article describes the seminal studies validating the antidepressant effects of fluoxetine.
Expert opinion: Fluoxetine is the first selective SSRI that has a recognised clinical efficacy and safety profile. Since its discovery, other molecules that mimic its mechanism of action have been developed, commencing a new
age in the treatment of depression. Fluoxetine has also demonstrated utility in the treatment of other disorders for which its prescription has now been approved
Pixel and Voxel Representations of Graphs
We study contact representations for graphs, which we call pixel
representations in 2D and voxel representations in 3D. Our representations are
based on the unit square grid whose cells we call pixels in 2D and voxels in
3D. Two pixels are adjacent if they share an edge, two voxels if they share a
face. We call a connected set of pixels or voxels a blob. Given a graph, we
represent its vertices by disjoint blobs such that two blobs contain adjacent
pixels or voxels if and only if the corresponding vertices are adjacent. We are
interested in the size of a representation, which is the number of pixels or
voxels it consists of.
We first show that finding minimum-size representations is NP-complete. Then,
we bound the representation sizes needed for certain graph classes. In 2D, we
bound the number of pixels that is always sufficient and sometimes necessary
for k-outerplanar graphs with n vertices. In particular, outerplanar graphs
can be represented with a linear number of pixels, whereas general planar
graphs sometimes need a quadratic number. In 3D, we bound the number of voxels
that is always sufficient and sometimes necessary for any n-vertex graph, and
we improve this bound for graphs of bounded treewidth and for graphs of
bounded genus. In particular, planar graphs admit representations with
asymptotically fewer voxels than the general bound requires
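The definition above is easy to state operationally: vertices map to disjoint connected blobs of pixels, and two blobs must touch (share a pixel edge) exactly when the corresponding vertices are adjacent. A small validity checker for 2D pixel representations, with a hypothetical example for the path on three vertices:

```python
def is_pixel_representation(blobs, edges):
    """blobs: dict vertex -> set of (x, y) pixel coordinates.
    edges: set of frozensets {u, v}.  Checks the contact-representation
    conditions: blobs are pairwise disjoint, each blob is connected,
    and two blobs touch if and only if their vertices are adjacent."""
    def neighbors(p):
        x, y = p
        return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

    def connected(blob):
        start = next(iter(blob))
        seen, stack = {start}, [start]
        while stack:
            for q in neighbors(stack.pop()) & blob - seen:
                seen.add(q)
                stack.append(q)
        return seen == blob

    if sum(len(b) for b in blobs.values()) != len(set().union(*blobs.values())):
        return False  # blobs overlap
    if not all(connected(b) for b in blobs.values()):
        return False  # some blob is disconnected
    verts = list(blobs)
    for i, u in enumerate(verts):
        for v in verts[i + 1:]:
            touch = any(n in blobs[v] for p in blobs[u] for n in neighbors(p))
            if touch != (frozenset((u, v)) in edges):
                return False  # adjacency does not match the graph
    return True

# The path a-b-c: three unit blobs in a row, 3 pixels in total.
blobs = {'a': {(0, 0)}, 'b': {(1, 0)}, 'c': {(2, 0)}}
print(is_pixel_representation(blobs, {frozenset('ab'), frozenset('bc')}))  # True
```

The NP-completeness result in the abstract is about minimising the total pixel count over all representations that this predicate accepts.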
Realisation of a programmable two-qubit quantum processor
The universal quantum computer is a device capable of simulating any physical
system and represents a major goal for the field of quantum information
science. Algorithms performed on such a device are predicted to offer
significant gains for some important computational tasks. In the context of
quantum information, "universal" refers to the ability to perform arbitrary
unitary transformations in the system's computational space. The combination of
arbitrary single-quantum-bit (qubit) gates with an entangling two-qubit gate is
a gate set capable of achieving universal control of any number of qubits,
provided that these gates can be performed repeatedly and between arbitrary
pairs of qubits. Although gate sets have been demonstrated in several
technologies, they have as yet been tailored toward specific tasks, forming a
small subset of all unitary operators. Here we demonstrate a programmable
quantum processor that realises arbitrary unitary transformations on two
qubits, which are stored in trapped atomic ions. Using quantum state and
process tomography, we characterise the fidelity of our implementation for 160
randomly chosen operations. This universal control is equivalent to simulating
any pairwise interaction between spin-1/2 systems. A programmable multi-qubit
register could form a core component of a large-scale quantum processor, and
the methods used here are suitable for such a device.
Comment: 7 pages, 4 figures
Understanding the threats posed by non-native species: public vs. conservation managers.
Public perception is a key factor influencing current conservation policy. Therefore, it is important to determine the influence of the public, end-users and scientists on the prioritisation of conservation issues and the direct implications for policy makers. Here, we assessed public attitudes and the perception of conservation managers to five non-native species in the UK, with these supplemented by those of an ecosystem user, freshwater anglers. We found that threat perception was not influenced by the volume of scientific research or by the actual threats posed by the specific non-native species. Media interest also reflected public perception and vice versa. Anglers were most concerned with perceived threats to their recreational activities but their concerns did not correspond to the greatest demonstrated ecological threat. The perception of conservation managers was an amalgamation of public and angler opinions but was mismatched to quantified ecological risks of the species. As this suggests that invasive species management in the UK is vulnerable to a knowledge gap, researchers must consider the intrinsic characteristics of their study species to determine whether raising public perception will be effective. The case study of the topmouth gudgeon Pseudorasbora parva reveals that media pressure and political debate has greater capacity to ignite policy changes and impact studies on non-native species than scientific evidence alone
No imminent quantum supremacy by boson sampling
It is predicted that quantum computers will dramatically outperform their
conventional counterparts. However, large-scale universal quantum computers are
yet to be built. Boson sampling is a rudimentary quantum algorithm tailored to
the platform of photons in linear optics, which has sparked interest as a rapid
way to demonstrate this quantum supremacy. Photon statistics are governed by
intractable matrix functions known as permanents, which suggests that sampling
from the distribution obtained by injecting photons into a linear-optical
network could be solved more quickly by a photonic experiment than by a
classical computer. The contrast between the apparently awesome challenge faced
by any classical sampling algorithm and the apparently near-term experimental
resources required for a large boson sampling experiment has raised
expectations that quantum supremacy by boson sampling is on the horizon. Here
we present classical boson sampling algorithms and theoretical analyses of
prospects for scaling boson sampling experiments, showing that near-term
quantum supremacy via boson sampling is unlikely. While the largest boson
sampling experiments reported so far are with 5 photons, our classical
algorithm, based on Metropolised independence sampling (MIS), allowed the boson
sampling problem to be solved for 30 photons with standard computing hardware.
We argue that the impact of experimental photon losses means that demonstrating
quantum supremacy by boson sampling would require a step change in technology.
Comment: 25 pages, 9 figures. Comments welcome
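The permanents that govern boson-sampling amplitudes can be computed exactly for small matrices with Ryser's formula in O(2^n · n²) time (a standard method, not the paper's Metropolised independence sampler): perm(A) = (−1)^n Σ_{S⊆[n]} (−1)^{|S|} Π_i Σ_{j∈S} a_{ij}.

```python
from itertools import combinations

def permanent(A):
    """Ryser's formula for the matrix permanent:
    perm(A) = (-1)^n * sum over nonempty column subsets S of
    (-1)^{|S|} * prod_i sum_{j in S} A[i][j].
    Exponential time, fine for the small matrices in this sketch;
    entries may be real or complex."""
    n = len(A)
    total = 0
    for k in range(1, n + 1):
        for S in combinations(range(n), k):
            prod = 1
            for row in A:
                prod *= sum(row[j] for j in S)
            total += (-1) ** k * prod
    return (-1) ** n * total

print(permanent([[1, 2], [3, 4]]))  # 10  (= 1*4 + 2*3)
```

Unlike the determinant, no polynomial-time algorithm is known (or, by Valiant's #P-hardness result, expected); this gap between permanents and determinants is the complexity-theoretic engine behind boson-sampling hardness arguments.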